Minimization Problems Based on Relative $\alpha$-Entropy II: Reverse Projection
Authors
Abstract
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted Iα) were studied. Such minimizers were called forward Iα-projections. Here, a complementary class of minimization problems, leading to the so-called reverse Iα-projections, is studied. Reverse Iα-projections, particularly on log-convex or power-law families, are of interest in robust estimation problems (α > 1) and in constrained compression settings (α < 1). Orthogonality of the power-law family with an associated linear family is first established and is then exploited to turn a reverse Iα-projection into a forward Iα-projection. The transformed problem is a simpler quasiconvex minimization subject to linear constraints.
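For orientation, the relative α-entropy of P with respect to Q on a finite alphabet is usually written in this line of work (up to notational differences from the paper) as

$$
I_\alpha(P,Q) \;=\; \frac{\alpha}{1-\alpha}\,\log\sum_{x} p(x)\,q(x)^{\alpha-1}
\;+\;\log\sum_{x} q(x)^{\alpha}
\;-\;\frac{1}{1-\alpha}\,\log\sum_{x} p(x)^{\alpha},
\qquad \alpha>0,\ \alpha\neq 1,
$$

which recovers the Kullback-Leibler divergence as α → 1. A forward Iα-projection fixes Q and minimizes Iα(P,Q) over P in a linear family 𝕃; a reverse Iα-projection fixes P and minimizes Iα(P,Q) over Q in a model family (here a power-law family) ℳ:

$$
\text{forward: } \min_{P \in \mathbb{L}} I_\alpha(P,Q),
\qquad
\text{reverse: } \min_{Q \in \mathcal{M}} I_\alpha(P,Q).
$$

The following sketch is only a rough numerical illustration of the kind of "quasiconvex minimization subject to linear constraints" that the transformed (forward) problem becomes; it is not the paper's algorithm, and the alphabet, statistic f, target t, and solver choice are all illustrative assumptions.

```python
# Minimal numerical sketch (not the paper's method): forward I_alpha-projection of a
# reference distribution Q onto the linear family {P : E_P[f] = t}, treated as a
# generic constrained minimization with SciPy.  All data below are toy values.
import numpy as np
from scipy.optimize import minimize

def relative_alpha_entropy(p, q, alpha):
    """I_alpha(P, Q) on a finite alphabet, for alpha > 0, alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    t1 = (alpha / (1.0 - alpha)) * np.log(np.sum(p * q ** (alpha - 1.0)))
    t2 = np.log(np.sum(q ** alpha))
    t3 = -(1.0 / (1.0 - alpha)) * np.log(np.sum(p ** alpha))
    return t1 + t2 + t3

alpha = 0.7
q = np.array([0.4, 0.3, 0.2, 0.1])   # reference distribution Q
f = np.array([1.0, 2.0, 3.0, 4.0])   # statistic defining the linear family
t = 2.5                              # required mean of f under P

def objective(p):
    return relative_alpha_entropy(p, q, alpha)

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},   # P is a probability vector
    {"type": "eq", "fun": lambda p: p @ f - t},         # linear statistical constraint
]
res = minimize(objective, x0=np.full(4, 0.25), method="SLSQP",
               bounds=[(1e-9, 1.0)] * 4, constraints=constraints)

print("projection P*:", np.round(res.x, 4))
print("I_alpha(P*, Q):", round(float(res.fun), 6))
```

Because the objective is quasiconvex rather than convex, a generic local solver is only a sanity check; the point of the paper's orthogonality argument is precisely to reduce the reverse projection to this better-behaved forward form.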
Similar Articles
Minimization Problems Based on a Parametric Family of Relative Entropies I: Forward Projection
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted Iα), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual r...
Relative $\alpha$-Entropy Minimizers Subject to Linear Statistical Constraints
We study minimization of a parametric family of relative entropies, termed relative α-entropies (denoted Iα(P,Q)). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relati...
Projection Pursuit Through Relative Entropy Minimization
Consider a defined density on a set of very large dimension. It is quite difficult to find an estimate of this density from a data set. However, it is possible through a projection pursuit methodology to solve this problem. In his seminal article, Huber (see "Projection pursuit", Annals of Statistics, 1985) demonstrates the interest of his method in a very simple given case. He considers the factori...
Solving MRF Minimization by Mirror Descent
Markov Random Fields (MRF) minimization is a well-known problem in computer vision. We consider the augmented dual of the MRF minimization problem and develop a Mirror Descent algorithm based on weighted Entropy and Euclidean Projection. The augmented dual problem consists of maximizing a non-differentiable objective function subject to simplex and linear constraints. We analyze the convergence...